A Size-Free CLT for Poisson Multinomials and its Applications
An (n, k)-Poisson Multinomial Distribution (PMD) is the distribution of the
sum of n independent random vectors supported on the set {e_1, ..., e_k} of standard basis vectors in R^k. We show
that any (n, k)-PMD is poly(k/σ)-close in total
variation distance to the (appropriately discretized) multi-dimensional
Gaussian with the same first two moments, where σ² is the minimum eigenvalue of
the PMD's covariance matrix, removing the dependence on n from
the Central Limit Theorem of Valiant and Valiant. Interestingly, our CLT is
obtained by bootstrapping the Valiant-Valiant CLT itself through the structural
characterization of PMDs shown in recent work by Daskalakis, Kamath, and
Tzamos. In turn, our stronger CLT can be leveraged to obtain an efficient PTAS
for approximate Nash equilibria in anonymous games, significantly improving the
state of the art, and matching qualitatively the running time dependence on n
and 1/ε of the best known algorithm for two-strategy anonymous
games. Our new CLT also enables the construction of covers for the set of
(n, k)-PMDs, which are proper and whose size is shown to be essentially
optimal. Our cover construction combines our CLT with the Shapley-Folkman
theorem and recent sparsification results for Laplacian matrices by Batson,
Spielman, and Srivastava. Our cover size lower bound is based on an algebraic
geometric construction. Finally, leveraging the structural properties of the
Fourier spectrum of PMDs, we show that these distributions can be learned from
Õ_k(1/ε²) samples in poly_k(1/ε) time, removing
the quasi-polynomial dependence of the running time on 1/ε from the
algorithm of Daskalakis, Kamath, and Tzamos.
Comment: To appear in STOC 2016
On the Structure, Covering, and Learning of Poisson Multinomial Distributions
An (n, k)-Poisson Multinomial Distribution (PMD) is the distribution of the
sum of n independent random vectors supported on the set {e_1, ..., e_k} of standard basis vectors in R^k. We prove
a structural characterization of these distributions, showing that, for all
ε > 0, any (n, k)-Poisson multinomial random vector is
ε-close, in total variation distance, to the sum of a discretized
multidimensional Gaussian and an independent (poly(k/ε), k)-Poisson multinomial random vector. Our structural characterization extends
the multi-dimensional CLT of Valiant and Valiant, by simultaneously applying to
all approximation requirements ε. In particular, it overcomes
factors depending on n and, importantly, on the minimum eigenvalue of the
PMD's covariance matrix from the distance to a multidimensional Gaussian random
variable.
We use our structural characterization to obtain an ε-cover, in
total variation distance, of the set of all (n, k)-PMDs, significantly
improving the cover size of Daskalakis and Papadimitriou, and obtaining the
same qualitative dependence of the cover size on n and ε as the
k = 2 cover of Daskalakis and Papadimitriou. We further exploit this structure
to show that (n, k)-PMDs can be learned to within ε in total
variation distance from Õ_k(1/ε²) samples, which is
near-optimal in terms of dependence on ε and independent of n. In
particular, our result generalizes the single-dimensional result of Daskalakis,
Diakonikolas, and Servedio for Poisson Binomials to arbitrary dimension.
Comment: 49 pages; extended abstract appeared in FOCS 2015